Smoothed Analysis of Gaussian Elimination

Authors

  • Arvind Sankar
  • Rodolfo Ruben Rosales
Abstract

We present a smoothed analysis of Gaussian elimination, both with partial pivoting and without pivoting. Let Ā be any matrix and let A be a slight random perturbation of Ā. We prove that it is unlikely that A has a large condition number. Using this result, we prove that it is unlikely that A has a large growth factor under Gaussian elimination without pivoting. By combining these results, we bound the smoothed precision needed to perform Gaussian elimination without pivoting. Our results improve the average-case analysis of Gaussian elimination without pivoting performed by Yeung and Chan (SIAM J. Matrix Anal. Appl., 1997). We then extend the result on the growth factor to the case of partial pivoting, and present the first analysis of partial pivoting that gives a sub-exponential bound on the growth factor. In particular, we show that if the random perturbation is Gaussian with variance σ², then the growth factor is bounded by (n/σ)^{O(log n)} with very high probability.

Thesis Supervisor: Daniel A. Spielman
Title: Associate Professor of Applied Mathematics
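The two quantities the abstract bounds can be measured empirically. A minimal NumPy sketch (the function name and the choice of Ā are illustrative, not from the thesis): it forms a smoothed matrix A = Ā + σG with G having i.i.d. standard Gaussian entries, then computes the condition number and the growth factor of Gaussian elimination without pivoting.

```python
import numpy as np

def growth_factor_no_pivot(A):
    """Largest entry appearing during elimination divided by the largest
    entry of the original matrix, with no row exchanges (no pivoting)."""
    U = A.astype(float).copy()
    n = U.shape[0]
    initial_max = np.abs(U).max()
    max_entry = initial_max
    for k in range(n - 1):
        # Eliminate the entries below the (k, k) pivot without pivoting.
        multipliers = U[k + 1:, k] / U[k, k]
        U[k + 1:, k:] -= np.outer(multipliers, U[k, k:])
        max_entry = max(max_entry, np.abs(U).max())
    return max_entry / initial_max

rng = np.random.default_rng(0)
n, sigma = 50, 1e-2
A_bar = np.ones((n, n))                         # an arbitrary fixed matrix Ā
A = A_bar + sigma * rng.standard_normal((n, n)) # smoothed instance

print("condition number:", np.linalg.cond(A))
print("growth factor   :", growth_factor_no_pivot(A))
```

With high probability the perturbation makes every leading pivot nonzero, so elimination without pivoting runs to completion; the smoothed-analysis results say the printed quantities are unlikely to be large relative to n/σ.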


Similar Resources

Smoothed Analysis of the Condition Numbers and Growth Factors of Matrices

Let Ā be any matrix and let A be a slight random perturbation of Ā. We prove that it is unlikely that A has a large condition number. Using this result, we prove it is unlikely that A has a large growth factor under Gaussian elimination without pivoting. By combining these results, we bound the smoothed precision needed by Gaussian elimination without pivoting. Our results improve the average-case ...

Full text

Seismic modeling using the frozen Gaussian approximation

We adopt the frozen Gaussian approximation (FGA) for modeling seismic waves. The method belongs to the category of ray-based beam methods. It decomposes the seismic wavefield into a set of Gaussian functions and propagates these Gaussian functions along appropriate ray paths. As opposed to the classic Gaussian-beam method, FGA keeps the Gaussians frozen (at a fixed width) during the propagation pro...

Full text

From Almost Gaussian to Gaussian Bounding Differences of Differential Entropies

We consider lower and upper bounds on the difference of differential entropies of a Gaussian random vector and an almost Gaussian random vector after both are “smoothed” by an arbitrarily distributed random vector of finite power. These bounds are important to prove the optimality of corner points in the capacity region of Gaussian interference channels. The upper bound, presented in MaxEnt-201...

Full text

Smoothed Action Value Functions for Learning Gaussian Policies

State-action value functions (i.e., Q-values) are ubiquitous in reinforcement learning (RL), giving rise to popular algorithms such as SARSA and Q-learning. We propose a new notion of action value defined by a Gaussian smoothed version of the expected Q-value. We show that such smoothed Q-values still satisfy a Bellman equation, making them learnable from experience sampled from an environment. ...

Full text

CSC 2530 Assignment: Panoramic Mosaic

This is a report on panoramic mosaic for a CSC2530 assignment. It uses a translation-only image registration technique proposed by Lucas and Kanade [2]. A coarse-to-fine method is used to produce better estimates. Images are put through the following pipeline. They are first warped into cylindrical coordinates, and then, smoothed with a 5 by 5 Gaussian filter with a standard deviation of 1.0. F...
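The smoothing step described above is a standard separable Gaussian blur. A minimal NumPy sketch (not the report's code; the function names are illustrative) of a 5×5 Gaussian filter with standard deviation 1.0, applied as two 1-D passes:

```python
import numpy as np

def gaussian_kernel_1d(size=5, sigma=1.0):
    # Sampled, normalized 1-D Gaussian centered on the kernel.
    x = np.arange(size) - size // 2
    k = np.exp(-x**2 / (2.0 * sigma**2))
    return k / k.sum()

def smooth(image, size=5, sigma=1.0):
    # Separable blur: convolve each row, then each column, with the
    # same 1-D kernel; edge padding keeps the output the input's size.
    k = gaussian_kernel_1d(size, sigma)
    pad = size // 2
    padded = np.pad(image, pad, mode="edge")
    rows = np.apply_along_axis(
        lambda r: np.convolve(r, k, mode="valid"), 1, padded)
    return np.apply_along_axis(
        lambda c: np.convolve(c, k, mode="valid"), 0, rows)

rng = np.random.default_rng(1)
image = rng.random((64, 64))
blurred = smooth(image)  # same shape as the input
```

Separability turns one 5×5 convolution (25 multiplies per pixel) into two length-5 passes (10 multiplies per pixel), which is why pipelines like the one above typically implement it this way.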

Full text



Publication date: 2004